Discrete all-positive multilayer perceptrons for optical implementation
Authors
Abstract
Similar papers
Discrete All-positive Multilayer Perceptrons for Optical Implementation
All-optical multilayer perceptrons differ in various ways from the ideal neural network model. Examples are the use of non-ideal activation functions which are truncated, asymmetric, and have a non-standard gain; restriction of the network parameters to non-negative values; and the limited accuracy of the weights. In this paper, a backpropagation-based learning rule is presented that compensates...
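The two constraints named in the abstract (a truncated, non-ideal activation and all-positive parameters) can be sketched with a toy projected-gradient loop: after each backpropagation step the weights are clipped back to the non-negative orthant. This is a minimal illustration, not the paper's actual learning rule; the network size, activation shape, learning rate, and data are all assumed for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def act(x):
    # Truncated, all-positive activation: a clipped ramp on [0, 1],
    # standing in for a non-ideal optical response (assumed shape).
    return np.clip(x, 0.0, 1.0)

def act_deriv(x):
    # Derivative of the clipped ramp: 1 inside (0, 1), 0 outside.
    return ((x > 0.0) & (x < 1.0)).astype(float)

# Tiny 2-2-1 network with non-negative initial weights (toy setup).
W1 = rng.uniform(0.0, 0.5, (2, 2))
W2 = rng.uniform(0.0, 0.5, (1, 2))

X = np.array([[0.1, 0.9], [0.8, 0.2]])
y = np.array([[0.9], [0.1]])
lr = 0.5

for _ in range(2000):
    z1 = X @ W1.T
    h = act(z1)
    z2 = h @ W2.T
    out = act(z2)
    err = out - y
    # Standard backpropagation gradients.
    d2 = err * act_deriv(z2)
    d1 = (d2 @ W2) * act_deriv(z1)
    W2 -= lr * d2.T @ h / len(X)
    W1 -= lr * d1.T @ X / len(X)
    # Projection step: keep every parameter non-negative,
    # mimicking the all-positive optical constraint.
    np.clip(W1, 0.0, None, out=W1)
    np.clip(W2, 0.0, None, out=W2)

mse = float(np.mean((act(act(X @ W1.T) @ W2.T) - y) ** 2))
```

The projection after each update is the simplest way to honor a sign constraint; the paper's compensation scheme is more involved, but the clipping step shows where the constraint enters the training loop.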
Functional preprocessing for multilayer perceptrons
In many applications, high-dimensional input data can be considered as sampled functions. We show in this paper how to use this prior knowledge to implement functional preprocessings that consistently reduce the dimension of the data, even when they have missing values. Preprocessed functions are then handled by a numerical MLP which approximates the theoretical functional MLP. A succes...
Entropy Minimization Algorithm for Multilayer Perceptrons
We have previously proposed the use of quadratic Renyi’s error entropy with a Parzen density estimator with Gaussian kernels as an alternative optimality criterion for supervised neural network training, and showed that it produces better performance on the test data compared to the MSE. The error entropy criterion imposes the minimization of average information content in the error signal rath...
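The criterion described above has a compact form: for quadratic Renyi entropy with a Parzen estimator using Gaussian kernels, the "information potential" is a pairwise sum of Gaussian interactions between error samples, and the entropy is its negative log. A minimal sketch (the kernel width `sigma` is an assumed value, not one from the paper):

```python
import numpy as np

def information_potential(errors, sigma=0.5):
    """Parzen estimate of the information potential V for quadratic
    Renyi entropy, H2 = -log V, using Gaussian kernels of width sigma.
    The convolution of two such kernels has variance 2 * sigma**2."""
    e = np.asarray(errors, dtype=float)
    n = e.size
    diff = e[:, None] - e[None, :]
    s2 = 2.0 * sigma ** 2
    kern = np.exp(-diff ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    return kern.sum() / n ** 2

def renyi_quadratic_entropy(errors, sigma=0.5):
    return -np.log(information_potential(errors, sigma))

# Concentrated errors carry less information (lower entropy)
# than widely spread errors, which is why minimizing this
# criterion drives the error distribution toward a spike.
tight = renyi_quadratic_entropy([0.01, -0.02, 0.0, 0.015])
loose = renyi_quadratic_entropy([1.0, -2.0, 0.5, -1.5])
```

Because H2 = -log V, minimizing the entropy is equivalent to maximizing the information potential, so the pairwise kernel sum is what gets differentiated during training.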
Quantile regression with multilayer perceptrons
We consider nonlinear quantile regression involving multilayer perceptrons (MLPs). In this paper we investigate the asymptotic behavior of quantile regression in a general framework: first by allowing possibly non-identifiable regression models, such as MLPs with redundant hidden units, then by relaxing the conditions on the density of the noise. We present a universal bound for the...
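Quantile regression, whether with MLPs or any other model, fits by minimizing the pinball (quantile) loss rather than squared error. A short sketch of that criterion, with an illustrative check that minimizing it over a constant predictor recovers the empirical quantile (the data and quantile level 0.9 are assumed for the example):

```python
import numpy as np

def pinball_loss(y_true, y_pred, tau):
    """Quantile (pinball) loss at level tau: positive residuals are
    weighted by tau, negative residuals by (1 - tau)."""
    r = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return float(np.mean(np.where(r >= 0, tau * r, (tau - 1) * r)))

# Minimizing the pinball loss over a constant predictor recovers the
# empirical tau-quantile of the data (grid search for illustration).
rng = np.random.default_rng(1)
y = rng.normal(size=1000)
cands = np.linspace(-3, 3, 601)
best = cands[np.argmin([pinball_loss(y, np.full_like(y, c), 0.9)
                        for c in cands])]
```

Replacing the constant predictor with an MLP's output and differentiating the same loss gives the training criterion the abstract refers to.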
Fast training of multilayer perceptrons
Training a multilayer perceptron with an error backpropagation algorithm is slow and uncertain. This paper describes a new approach which is much faster and more reliable than error backpropagation. The proposed approach is based on combined iterative and direct solution methods. In this approach, we use an inverse transformation to linearize the nonlinear output activation functions, direct soluti...
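The direct-solution idea for the output layer can be sketched as follows: if the output activation is a logistic sigmoid, applying its inverse (the logit) to the targets turns the output-layer fit into an ordinary linear least-squares problem. This is a simplified illustration under assumed data; the hidden-activation matrix `H`, the synthetic targets, and the specific sigmoid are all stand-ins, not the paper's setup.

```python
import numpy as np

def logit(y, eps=1e-6):
    # Inverse of the logistic output activation; eps guards the
    # endpoints so the transform stays finite.
    y = np.clip(y, eps, 1.0 - eps)
    return np.log(y / (1.0 - y))

# Assumed setup: H holds hidden-layer activations for 50 samples,
# and targets T lie in (0, 1) as sigmoid outputs of a known weight
# vector, so the direct solve can be checked against ground truth.
rng = np.random.default_rng(2)
H = rng.uniform(0.1, 0.9, (50, 5))
w_true = rng.normal(size=5)
T = 1.0 / (1.0 + np.exp(-(H @ w_true)))

# Linearize: transform the targets, then solve least squares directly
# for the output weights instead of iterating with backpropagation.
w, *_ = np.linalg.lstsq(H, logit(T), rcond=None)
pred = 1.0 / (1.0 + np.exp(-(H @ w)))
```

The one-shot least-squares solve is what makes such methods fast: the nonlinearity is moved onto the targets, so no gradient iterations are needed for that layer.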
Journal
Journal title: Optical Engineering
Year: 1998
ISSN: 0091-3286
DOI: 10.1117/1.601963